Bayesian Learning in Undirected Graphical Models: Approximate MCMC Algorithms
Abstract
Bayesian learning in undirected graphical models, that is, computing posterior distributions over parameters and predictive quantities, is exceptionally difficult. We conjecture that for general undirected models there are no tractable MCMC (Markov chain Monte Carlo) schemes giving the correct equilibrium distribution over parameters. While this intractability, due to the partition function, is familiar to those performing parameter optimisation, Bayesian learning of posterior distributions over undirected model parameters has been largely unexplored and poses novel challenges. We propose several approximate MCMC schemes and test them on fully observed binary models (Boltzmann machines), using a small coronary heart disease data set and larger artificial systems. While approximations must perform well on the model, their interaction with the sampling scheme is also important. Samplers based on variational mean-field approximations generally performed poorly, whereas more advanced methods using loopy propagation, brief sampling and stochastic dynamics led to acceptable parameter posteriors. Finally, we demonstrate these techniques on a Markov random field with hidden variables.
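To make the difficulty concrete, here is a minimal, illustrative sketch rather than the paper's actual algorithms: Metropolis sampling over the weights of a small fully observed Boltzmann machine with ±1 units and a Gaussian prior, where the intractable partition-function ratio Z(W')/Z(W) in the acceptance probability is replaced by an importance-sampling estimate driven by brief Gibbs sampling from the current model. All function names, shapes, and hyperparameters below are assumptions for illustration; because the ratio is only estimated, the chain does not have exactly the right equilibrium distribution, which is precisely the problem the abstract describes.

```python
# Sketch only: approximate Metropolis sampling of Boltzmann machine weights.
# The log Z difference is estimated, so the sampler is only approximately correct.
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sweeps(W, n_samples, n_sweeps=5):
    """Brief Gibbs sampling from p(x | W) over x in {-1, +1}^d (W symmetric, zero diagonal)."""
    d = W.shape[0]
    X = rng.choice([-1.0, 1.0], size=(n_samples, d))
    for _ in range(n_sweeps):
        for i in range(d):
            field = X @ W[:, i]
            p_on = 1.0 / (1.0 + np.exp(-2.0 * field))
            X[:, i] = np.where(rng.random(n_samples) < p_on, 1.0, -1.0)
    return X

def log_z_ratio_estimate(W_new, W_old, n_samples=200):
    """Estimate log Z(W_new) - log Z(W_old) via importance sampling:
    Z(W_new)/Z(W_old) = E_{x ~ p(.|W_old)}[exp(0.5 x^T (W_new - W_old) x)]."""
    X = gibbs_sweeps(W_old, n_samples)
    log_w = 0.5 * np.einsum('ni,ij,nj->n', X, W_new - W_old, X)
    m = log_w.max()
    return np.log(np.mean(np.exp(log_w - m))) + m

def metropolis_posterior(data, n_iter=2000, step=0.05, prior_std=1.0):
    """Approximate posterior samples over W given fully observed +/-1 data (n, d)."""
    n, d = data.shape
    W = np.zeros((d, d))
    samples = []
    for _ in range(n_iter):
        # Symmetric Gaussian proposal on the upper triangle, zero diagonal.
        dW = np.triu(step * rng.standard_normal((d, d)), 1)
        W_new = W + dW + dW.T
        # Data term and Gaussian prior are tractable; only the log Z term is estimated.
        delta_data = 0.5 * np.einsum('ni,ij,nj->', data, W_new - W, data)
        delta_prior = (np.sum(W**2) - np.sum(W_new**2)) / (2 * prior_std**2)
        delta_logz = n * log_z_ratio_estimate(W_new, W)
        if np.log(rng.random()) < delta_data + delta_prior - delta_logz:
            W = W_new
        samples.append(W.copy())
    return np.array(samples)
```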
Similar references
An Introduction to Restricted Boltzmann Machines
Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can be interpreted as stochastic neural networks. The increase in computational power and the development of faster learning algorithms have made them applicable to relevant machine learning problems. They attracted much attention recently after being proposed as building blocks of multi-layer learning systems called d...
Particle Filtered MCMC-MLE with Connections to Contrastive Divergence
Learning undirected graphical models such as Markov random fields is an important machine learning task with applications in many domains. Since it is usually intractable to learn these models exactly, various approximate learning techniques have been developed, such as contrastive divergence (CD) and Markov chain Monte Carlo maximum likelihood estimation (MCMC-MLE). In this paper, we introduce...
Training restricted Boltzmann machines: An introduction
Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can be interpreted as stochastic neural networks. They have attracted much attention as building blocks for the multi-layer learning systems called deep belief networks, and variants and extensions of RBMs have found application in a wide range of pattern recognition tasks. This tutorial introduces RBMs from the viewpo...
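As a pointer to what such training looks like in practice, the following is a minimal sketch of one common update rule, single-step contrastive divergence (CD-1), for a binary RBM. It is not the tutorial's full recipe; the array shapes, hyperparameters, and function names are illustrative assumptions, and refinements such as mini-batching schedules, momentum, and weight decay are omitted.

```python
# Sketch only: one CD-1 parameter update for a binary RBM.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.01):
    """One CD-1 update from a mini-batch of visible vectors v0 (n, n_visible).

    W: (n_visible, n_hidden) weights, b: visible biases, c: hidden biases.
    """
    # Positive phase: hidden activations given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to the visibles and up again.
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Approximate log-likelihood gradient: data statistics minus model statistics.
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```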
libDAI: A Free and Open Source C++ Library for Discrete Approximate Inference in Graphical Models
This paper describes the software package libDAI, a free & open source C++ library that provides implementations of various exact and approximate inference methods for graphical models with discrete-valued variables. libDAI supports directed graphical models (Bayesian networks) as well as undirected ones (Markov random fields and factor graphs). It offers various approximations of the partition...
A conditional independence algorithm for learning undirected graphical models
When it comes to learning graphical models from data, approaches based on conditional independence tests are among the most popular methods. Since Bayesian networks dominate research in this field, these methods usually refer to directed graphs, and thus have to determine not only the set of edges, but also their direction. At least for a certain kind of possibilistic graphical models, however,...